Internet area network

An Internet area network (IAN) is a concept for a communications network〔Winkleman, Roy. “Networking Handbook.” Florida Center for Instructional Technology, College of Education. 2009-2013. http://fcit.usf.edu/network/chap1/chap1.htm〕 that connects voice and data endpoints within a cloud environment over IP, replacing an existing local area network (LAN), wide area network (WAN), or the public switched telephone network (PSTN).
Seen by proponents as the networking model of the future,〔iAreaNetwork Vision Statement. http://www.iareanet.com/about-the-cloud-company.html〕 an IAN securely connects endpoints through the public Internet, so that they can communicate and exchange data without being tied to a physical location.
Unlike a LAN, which interconnects computers within a limited area such as a home, school, computer laboratory, or office building, or a WAN, which links networks across metropolitan, regional, or national boundaries over private or public transports, an IAN has no geographic profile at all: its applications and communications services are virtualized, so endpoints need only a broadband connection to the Internet.
Hosted in the cloud by a managed services provider, an IAN platform gives users secure access to information from anywhere, at any time, over an Internet connection, along with telephony, voicemail, e-mail, and fax services from any connected endpoint. For businesses, the hosted model reduces IT and communications expenses, protects against data loss and disaster downtime, and delivers a greater return on invested resources through increased employee productivity and lower telecom costs.
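The defining property described above is that endpoints are reached by identity through a cloud-hosted hub, not by their position on a LAN segment or phone line. The toy model below sketches that routing idea; all names in it (CloudHub, Endpoint) are hypothetical illustrations, not an API from any real IAN product.

```python
# Toy model of the IAN routing idea: endpoints register with a
# cloud-hosted hub under an identity, and messages are delivered by
# identity alone, regardless of each endpoint's physical location.
# CloudHub and Endpoint are hypothetical names for illustration only.

class CloudHub:
    """Stands in for the provider-hosted platform reachable over IP."""
    def __init__(self):
        self._endpoints = {}  # user id -> Endpoint, location-independent

    def register(self, endpoint):
        self._endpoints[endpoint.user_id] = endpoint

    def deliver(self, sender_id, recipient_id, message):
        # Routing keys off identity, not a LAN segment or phone line.
        recipient = self._endpoints.get(recipient_id)
        if recipient is None:
            raise KeyError(f"{recipient_id} is not registered")
        recipient.inbox.append((sender_id, message))


class Endpoint:
    """Any broadband-connected device: laptop, phone, desk handset."""
    def __init__(self, user_id, hub):
        self.user_id = user_id
        self.inbox = []
        hub.register(self)


hub = CloudHub()
alice = Endpoint("alice@example.com", hub)  # e.g. working from home
bob = Endpoint("bob@example.com", hub)      # e.g. in a hotel abroad
hub.deliver("alice@example.com", "bob@example.com",
            "voicemail: call me back")
print(bob.inbox)
```

Because delivery depends only on the registered identity, either endpoint could reconnect from a different network with no reconfiguration, which is the sense in which the IAN "eliminates a geographic profile" for the network.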
== History ==
The IAN is rooted in the rise of cloud computing, whose underlying concept dates back to the 1950s, when large-scale mainframes became available to academia and corporations, accessible via thin clients and terminal computers.〔Martínez-Mateo, J., Munoz-Hernandez, S. and Pérez-Rey, D. “A Discussion of Thin Client Technology for Computer Labs.” University of Madrid. May 2010. http://www.researchgate.net/publication/45917654_A_Discussion_of_Thin_Client_Technology_for_Computer_Labs〕 Because mainframes were costly to buy, it became important to maximize the return on the investment in them by letting multiple users share both physical access to the computer from multiple terminals and the CPU time itself, eliminating periods of inactivity. This practice became known in the industry as time-sharing.〔McCarthy, John. “Reminiscences on the History of Time Sharing.” Stanford University. 1983 Winter or Spring. http://www-formal.stanford.edu/jmc/history/timesharing/timesharing.html〕
The increasing demand and use of computers in universities and research labs in the late 1960s generated the need to provide high-speed interconnections between computer systems. A 1970 report from the Lawrence Radiation Laboratory detailing the growth of their "Octopus" network gave a good indication of the situation.〔Mendicino, Samuel. Computer Networks. 1972. pp 95-100. http://rogerdmoore.ca/PS/OCTOA/OCTO.html〕
As computers became more prevalent, scientists and technologists explored ways to make large-scale computing power available to more users through time-sharing, experimenting with algorithms to make optimal use of the infrastructure, platform, and applications while providing prioritized CPU access and efficiency for end users.
John McCarthy opined in the 1960s that "computation may someday be organized as a public utility."〔Garfinkle, Simson. “The Cloud Imperative.” MIT Technology Review. Oct. 3, 2011. http://www.technologyreview.com/news/425623/the-cloud-imperative/〕 Almost all the modern-day characteristics of cloud computing (elastic provision, delivery as a utility, online access, the illusion of infinite supply), the comparison to the electricity industry, and the use of public, private, government, and community forms were thoroughly explored in Douglas Parkhill's 1966 book, The Challenge of the Computer Utility.〔Parkhill, Douglas F. The Challenge of the Computer Utility. 1966. ISBN 9780201057201.〕 Other scholars have shown that cloud computing's roots go all the way back to the 1950s,〔Deboosere, L., De Wachter, J., Simoens, P., De Turck, F., Dhoedt, B., and Demeester, P. “Thin Client Computing Solutions in Low- and High-Motion Scenarios.” Third International Conference on Networking and Services (ICNS), 2007.〕 when scientist Herb Grosch (the author of Grosch's law) postulated that the entire world would operate on dumb terminals powered by about 15 large data centers.〔Gardner, W. David. “Author Of Grosch's Law Going Strong At 87.” InformationWeek. April 12, 2005. http://www.informationweek.com/author-of-groschs-law-going-strong-at-87/160701576〕 Because these powerful computers were so expensive, many corporations and other entities could obtain computing capability only through time-sharing, and several organizations, such as GE's GEISCO, IBM subsidiary The Service Bureau Corporation (SBC, founded in 1957), Tymshare (founded in 1966), National CSS (founded in 1967 and bought by Dun & Bradstreet in 1979), Dial Data (bought by Tymshare in 1968), and Bolt, Beranek and Newman (BBN), marketed time-sharing as a commercial venture.
The development of the Internet from document-centric, through semantic data, towards more and more services was described as the "Dynamic Web."〔“A History of the Dynamic Web.” Pingdom. Dec. 7, 2007. http://royal.pingdom.com/2007/12/07/a-history-of-the-dynamic-web/〕 This contribution focused in particular on the need for better metadata able to describe not only implementation details but also the conceptual details of model-based applications.
In the 1990s, telecommunications companies, which had previously offered primarily dedicated point-to-point data circuits, began offering virtual private network (VPN) services with comparable quality of service at a much lower cost. By switching traffic to balance utilization as they saw fit, they were able to optimize overall network usage.〔“Virtual Private Networks: Managing Telecom’s Golden Horde.” Billing World. May 1, 1999. http://www.billingworld.com/articles/1999/05/virtual-private-networks-managing-telecom-s-golde.aspx〕 The cloud symbol was used to denote the demarcation point between what was the responsibility of the provider and what was the responsibility of the users. Cloud computing extends this boundary to cover servers as well as the network infrastructure.
After the dot-com bubble, Amazon played a key role in the development of cloud computing by modernizing their data centers, which, like most computer networks, were using as little as 10% of their capacity at any one time, just to leave room for occasional spikes. Having found that the new cloud architecture resulted in significant internal efficiency improvements whereby small, fast-moving "two-pizza teams" (teams small enough to be fed with two pizzas〔Anders, George. “Inside Amazon's Idea Machine: How Bezos Decodes The Customer.” Forbes. April 2012〕) could add new features faster and more easily, Amazon initiated a new product development effort to provide cloud computing to external customers, and launched Amazon Web Services (AWS) on a utility computing basis in 2006.〔Arrington, Michael. “Interview with Jeff Bezos On Amazon Web Services.” TechCrunch, Nov. 14, 2006. http://techcrunch.com/2006/11/14/interview-with-jeff-bezos-on-amazon-web-services/〕
In early 2008, Eucalyptus became the first open-source, AWS API-compatible platform for deploying private clouds. Also in early 2008, OpenNebula, enhanced in the RESERVOIR European Commission-funded project, became the first open-source software for deploying private and hybrid clouds and for the federation of clouds.〔OpenNebula Website http://www.opennebula.org/start〕 In the same year, efforts focused on providing quality-of-service guarantees (as required by real-time interactive applications) to cloud-based infrastructures, in the framework of the IRMOS European Commission-funded project, resulting in a real-time cloud environment.〔IRMOS Website http://www.irmosproject.eu/〕 By mid-2008, Gartner saw an opportunity for cloud computing "to shape the relationship among consumers of IT services, those who use IT services and those who sell them" and observed that "organizations are switching from company-owned hardware and software assets to per-use service-based models" so that the "projected shift to computing... will result in dramatic growth in IT products in some areas and significant reductions in other areas."〔Plummer, Daryl. “Cloud Computing Confusion Leads to Opportunity.” Gartner Inc. June 2008〕
In 2011, RESERVOIR was established in Europe to create open-source technologies that allow cloud providers to build an advanced cloud, balancing workloads, lowering costs, and moving workloads across geographic locations through a federation of clouds.〔RESERVOIR Website http://www.reservoir-fp7.eu/〕 Also in 2011, IBM announced the Smarter Computing framework to support a Smarter Planet.〔IBM Smarter Planet Home Page. http://www.ibm.com/smarter-computing/us/en/analytics-infrastructure/〕 Among the various components of the Smarter Computing foundation, cloud computing is a critical piece.
Today, the ubiquitous availability of high-capacity networks, low-cost computers and storage devices, and the widespread adoption of hardware virtualization, service-oriented architecture, autonomic computing, and utility computing have led to tremendous growth in cloud computing. Virtual worlds〔Naone, Erica. “Peer to Peer Virtual Worlds.” MIT Technology Review. April 16, 2008. http://www.technologyreview.com/news/409912/peer-to-peer-virtual-worlds/〕 and peer-to-peer architectures have paved the way for the concept of an IAN.
iAreaNet was founded in 1999 by CEO James DeCrescenzo as a company called Internet Area Network, devoted to providing offsite data storage and disaster prevention before the cloud existed in widely deployed commercial form, and it pioneered the idea of an IAN. Since then, it has strengthened operations and made significant investments in developing a powerful infrastructure to provide businesses with an array of technology solutions, including the patent-pending iAreaOffice, which commercializes the IAN concept by eliminating the need for a traditional LAN, WAN, or telephone system for business communications.

Excerpt source: Wikipedia, the free encyclopedia.
Read the full Wikipedia article on "Internet area network".




Copyright(C) kotoba.ne.jp 1997-2016. All Rights Reserved.